33 research outputs found

    Self-Employment of Persons with a Migration Background in Germany: Causes of Ethnic Entrepreneurship

    Economic self-employment, ethnic inequality

    Early and Late Participation during the Field Period: Response Timing in a Mixed-Mode Probability-Based Panel Survey

    Reluctance of respondents to participate in surveys has long drawn the attention of survey researchers. Yet, little is known about what drives a respondent's decision to answer the survey invitation early or late during the field period. Moreover, we still lack evidence on response timing in longitudinal surveys. That is, the questions of whether response timing is a stable respondent characteristic and what, if anything, drives change in response timing across interviews remain open. We relied on data from a mixed-mode general population panel survey collected between 2014 and 2016 to study the stability of response timing across 18 panel waves and the factors that influence the decision to participate early or late in the field period. Our results suggest that the factors that affect response timing differ between the mail and web modes. Moreover, we found that experience with prior panel waves affected the respondent's decision to participate early or late. Overall, the present study advocates understanding response timing as a metric variable and, consequently, the need to reflect this in modeling strategies.
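    As a purely illustrative sketch of the "response timing as a metric variable" point above, the number of field days until a questionnaire is returned can be modeled directly, for example with a mixed model that lets timing vary within respondents across waves. The data, covariates, and model below are assumptions made for illustration, not the authors' analysis.

        import pandas as pd
        import statsmodels.formula.api as smf

        # Hypothetical long-format panel data: one row per respondent and wave,
        # with the number of field days until the questionnaire was returned.
        data = pd.DataFrame({
            "respondent_id": [1, 1, 2, 2, 3, 3, 4, 4, 5, 5, 6, 6],
            "wave":          [1, 2] * 6,
            "mode":          ["web"] * 6 + ["mail"] * 6,
            "response_day":  [2, 3, 1, 4, 3, 2, 10, 12, 8, 9, 11, 7],
        })

        # Treating response timing as metric: a linear mixed model with a random
        # intercept per respondent estimates mode and wave effects on timing while
        # capturing how stable timing is within respondents across waves.
        model = smf.mixedlm("response_day ~ mode + wave", data,
                            groups=data["respondent_id"]).fit()
        print(model.summary())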

    Risk of Nonresponse Bias and the Length of the Field Period in a Mixed-Mode General Population Panel

    Survey researchers are often confronted with the question of how long the field period should be. A longer fielding time might lead to greater participation, yet it requires survey managers to devote more of their time to data collection efforts. To facilitate the decision about the length of the field period, we investigated whether a longer fielding time reduces the risk of nonresponse bias, in order to judge whether field periods can be ended earlier without endangering the performance of the survey. Using data from six waves of a probability-based mixed-mode (online and mail) panel of the German population, we analyzed whether the risk of nonresponse bias decreases over the field period by investigating how day-by-day coefficients of variation develop during the field period. We then determined the optimal cut-off points for each mode after which data collection can be terminated without increasing the risk of nonresponse bias, and found that these cut-off points differ by mode. Our study complements prior research by shifting the perspective in the investigation of the risk of nonresponse bias to panel data and, in particular, to mixed-mode surveys. Our proposed method of using coefficients of variation to assess whether the risk of nonresponse bias decreases significantly with each additional day of fieldwork can aid survey practitioners in finding the optimal field period for their mixed-mode surveys.
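    As a rough sketch of the day-by-day coefficient-of-variation idea described above, one can track how much subgroup response rates still differ after each additional field day; once the coefficient of variation stops shrinking, extra fielding time adds little protection against nonresponse bias. The data, benchmark variable, and stopping logic below are illustrative assumptions, not the authors' implementation.

        import numpy as np
        import pandas as pd

        # Hypothetical gross sample: one row per invited panelist, with the field
        # day on which the questionnaire was returned (NaN = no response) and a
        # benchmark characteristic known for the whole sample, e.g. an age group.
        sample = pd.DataFrame({
            "response_day": [1, 3, np.nan, 7, 2, np.nan, 14, 5, 1, 9, 4, np.nan],
            "age_group":    [1, 2, 3, 1, 2, 3, 1, 2, 3, 1, 2, 3],
        })

        def cv_after(day):
            """Coefficient of variation of subgroup response rates after `day` field days."""
            responded = sample["response_day"] <= day
            rates = responded.groupby(sample["age_group"]).mean()
            return rates.std(ddof=0) / rates.mean()

        # If the CV no longer decreases noticeably with additional field days,
        # later days change the sample composition little, suggesting a cut-off.
        for day in range(1, 15):
            print(f"day {day:2d}: CV = {cv_after(day):.3f}")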

    Survey Data Documentation

    Documentation of research results is an essential process within the research lifecycle, which includes the steps of study planning and developing the survey instruments, data collection and preparation, data analysis, and data archiving. Primary researchers have to ensure that the collected data and all accompanying materials are properly documented and archived. This enables the scientific community to understand and reproduce the results of a scientific project. The purpose of this survey guideline is to provide a brief introduction to and overview of data preparation and data documentation in order to help primary researchers make their data and other study-related materials accessible in the long term. This overview will therefore help researchers comply with the principles of reproducibility as a crucial aspect of good scientific practice. The guideline will be useful for researchers who are in the stages of planning a study as well as for those who have already collected data and would like to prepare it for archiving.

    The effects of questionnaire completion using mobile devices on data quality: evidence from a probability-based general population panel

    "The use of mobile devices such as smartphones and tablets for survey completion is growing rapidly, raising concerns regarding data quality in general, and nonresponse and measurement error in particular. We use the data from six online waves of the GESIS Panel, a probability-based mixed-mode panel representative of the German population to study whether the responses provided using tablets or smartphones differ on indicators of measurement and nonresponse errors from responses provided via personal computers or laptops. We follow an approach chosen by Lugtig and Toepoel (2015), using the following indicators of nonresponse error: item nonresponse, providing an answer to an open question; and the following indicators of measurement error: straightlining, number of characters in open questions, choice of left-aligned options in horizontal scales, and survey duration. Moreover, we extend the scope of past research by exploring whether data quality is a function of device-type or respondent-type characteristics using multilevel models. Overall, we find that responding with mobile devices is associated with a higher likelihood of measurement discrepancies compared to PC/laptop survey completion. For smartphone survey completion, the indicators of measurement and nonresponse error tend to be higher than for tablet completion. We find that most indicators of nonresponse and measurement error used in our analysis cannot be attributed to the respondent characteristics but are rather effects of mobile devices." (author's abstract

    Adapting Surveys to the Modern World: Comparing a Research Messenger Design to a Regular Responsive Design for Online Surveys

    Online surveys are increasingly completed on smartphones. There are several ways to structure online surveys so as to create an optimal experience for any screen size. For example, communicating through applications (apps) such as WhatsApp and Snapchat closely resembles natural turn-by-turn conversations between individuals. Web surveys, however, currently mostly mimic the design of paper questionnaires, leading to a survey experience that may not be optimal when completed on smartphones. In this paper, we compare a research messenger design, which mimics a messenger-app type of communication, to a responsive survey design. We investigate whether response quality is similar between the two designs and whether respondents' satisfaction with the survey is higher for either version. Our results show no differences in primacy effects, number of nonsubstantive answers, and dropout rate. The length of open-ended answers was shorter in the research messenger survey than in the responsive design, and the overall completion time was longer in the research messenger survey. The evaluation at the end of the survey showed no clear indication that respondents liked the research messenger survey more than the responsive design. Future research should focus on how to optimally design online mixed-device surveys in order to increase respondent satisfaction and data quality.

    Understanding Willingness to Share Smartphone-Sensor Data

    The growing smartphone penetration and the integration of smartphones into people's everyday practices offer researchers opportunities to augment survey measurement with smartphone-sensor measurement or to replace self-reports. Potential benefits include lower measurement error, a widening of research questions, collection of in situ data, and a lower respondent burden. However, privacy considerations and other concerns may lead to nonparticipation. To date, little is known about the mechanisms of willingness to share sensor data in the general population, and no evidence is available concerning the stability of that willingness. The present study focuses on survey respondents' willingness to share data collected using smartphone sensors (GPS, camera, and wearables) in a probability-based online panel of the general population of the Netherlands. A randomized experiment varied the study sponsor, the framing of the request, the emphasis on control over the data collection process, and the assurance of privacy and confidentiality. Respondents were asked repeatedly about their willingness to share the data collected using smartphone sensors, with varying periods before the second request. Willingness to participate in sensor-based data collection varied by the type of sensor, the study sponsor, the order of the request, the respondent's familiarity with the device, previous experience with participating in research involving smartphone sensors, and privacy concerns. Willingness increased when respondents were asked repeatedly and varied by sensor and task. The timing of the repeated request, one month or six months after the initial request, did not have a significant effect on willingness.

    Online Surveys on Mobile Devices: Potentials and Challenges

    The central advantage of online surveys on mobile devices (tablets, smartphones) is their ubiquity and their technological potential. From a survey-methodological standpoint, however, such surveys pose a challenge, and the consequences for data quality cannot be ignored.